Tensor network method for reversible classical computation

Authors

  • Zhi-Cheng Yang
  • Stefanos Kourtis
  • Claudio Chamon
  • Eduardo R. Mucciolo
  • Andrei E. Ruckenstein
Abstract

We develop a tensor network technique that can solve universal reversible classical computational problems, formulated as vertex models on a square lattice [Nat. Commun. 8, 15303 (2017)]. By encoding the truth table of each vertex constraint in a tensor, the total number of solutions compatible with partial inputs/outputs at the boundary can be represented as the full contraction of a tensor network. We introduce an iterative compression-decimation (ICD) scheme that performs this contraction efficiently. The ICD algorithm first propagates local constraints to longer ranges via repeated contraction-decomposition sweeps over all lattice bonds, thus achieving compression on a given length scale. It then decimates the lattice via coarse-graining tensor contractions. Repeated iterations of these two steps gradually collapse the tensor network and ultimately yield the exact tensor trace for large systems, without the need for manual control of tensor dimensions. Our protocol allows us to obtain the exact number of solutions for computations where a naive enumeration would take astronomically long times.
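The two ingredients of the abstract, gate constraints encoded as 0/1 tensors and bond-wise contraction-decomposition, can be illustrated compactly. The sketch below is illustrative only and is not the authors' code: it uses numpy, a CNOT gate as the example vertex, and hypothetical helper names (`gate_tensor`, `contract_and_split`). It builds the truth-table tensor of a reversible two-bit gate and performs one compression step of the kind repeated in the ICD sweeps: two neighbouring tensors are contracted over a shared bond and split back with an SVD, keeping only nonzero singular values.

```python
import numpy as np

def gate_tensor(truth_table):
    """0/1 tensor T[a, b, x, y] = 1 iff the gate maps inputs (a, b) to outputs (x, y)."""
    T = np.zeros((2, 2, 2, 2))
    for (a, b), (x, y) in truth_table.items():
        T[a, b, x, y] = 1.0
    return T

# Example vertex: CNOT, (control, target) -> (control, control XOR target)
cnot = gate_tensor({(a, b): (a, a ^ b) for a in (0, 1) for b in (0, 1)})

def contract_and_split(T1, T2, tol=1e-12):
    """One bond update: contract T1's output leg with T2's input leg,
    then split back with an SVD, discarding numerically zero singular values."""
    M = np.tensordot(T1, T2, axes=([3], [0]))          # remaining legs: (a, b, x) and (c, u, v)
    mat = M.reshape(8, 8)                              # group (a, b, x) as rows, (c, u, v) as columns
    U, s, Vh = np.linalg.svd(mat, full_matrices=False)
    keep = s > tol
    chi = int(keep.sum())                              # compressed bond dimension
    A = (U[:, keep] * s[keep]).reshape(2, 2, 2, chi)   # absorb singular values into the left tensor
    B = Vh[keep, :].reshape(chi, 2, 2, 2)
    return A, B

A, B = contract_and_split(cnot, cnot)
print("compressed bond dimension:", A.shape[-1])       # prints 2: this bond is already minimal for CNOT
```

In the full scheme such bond updates are swept over all lattice bonds to propagate local constraints, and alternated with coarse-graining contractions that decimate the lattice; contracting the collapsed network with fixed boundary bits then sums products of 0/1 entries over all internal variables, i.e. it counts the satisfying assignments.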


Similar articles

Reversible Quantum Computation

It has long been known that to minimise the heat emitted by a deterministic computer during its operation, the computation must act in a logically reversible manner [Lan61]. Such logically reversible operations require a number of auxiliary bits to be stored, maintaining a history of the computation, which allows the initial state to be reconstructed by running the comput...

Full text
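As a minimal illustration of the point in this entry (a sketch of standard reversible-logic practice, not code from the cited work): an irreversible AND becomes reversible once its result is written into an auxiliary bit, as in the Toffoli gate, and the retained inputs are exactly the "history" that lets the initial state be reconstructed.

```python
def toffoli(a, b, c):
    """Reversible gate: (a, b, c) -> (a, b, c XOR (a AND b))."""
    return a, b, c ^ (a & b)

# Compute AND reversibly: the ancilla starts at 0 and ends holding a AND b,
# while a and b survive as the history needed to undo the computation.
a, b = 1, 1
a1, b1, and_ab = toffoli(a, b, 0)

# Applying the same gate again uncomputes the result, restoring the initial state.
assert toffoli(a1, b1, and_ab) == (a, b, 0)
print(and_ab)  # 1
```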

Robust Full Bayesian Learning for Radial Basis Networks

We propose a hierarchical full Bayesian model for radial basis networks. This model treats the model dimension (number of neurons), model parameters, regularization parameters, and noise parameters as unknown random variables. We develop a reversible-jump Markov chain Monte Carlo (MCMC) method to perform the Bayesian computation. We find that the results obtained using this method are not only ...

Full text

Quantum Computation and the Evaluation of Tensor Networks

We present a quantum algorithm that additively approximates the value of a tensor network to a certain scale. When combined with existing results, this provides a complete problem for quantum computation. The result is a simple new way of looking at quantum computation in which unitary gates are replaced by tensors and time is replaced by the order in which the tensor network is "swallowed". ...

Full text

Neuron Mathematical Model Representation of Neural Tensor Network for RDF Knowledge Base Completion

In this paper, a state-of-the-art neuron mathematical model of the neural tensor network (NTN) is proposed for the RDF knowledge base completion problem. One of the difficulties with the parameters of the network is that a representation of its neuron mathematical model has not been possible. For this reason, a new representation of this network is suggested that resolves this difficulty. In the representation, th...

Full text

Learning Curve Consideration in Makespan Computation Using Artificial Neural Network Approach

This paper presents an alternative method using an artificial neural network (ANN) to develop a scheduling scheme that determines the makespan, or cycle time, of a group of jobs going through a series of stages or workstations. The common conventional method uses mathematical programming techniques and presents the results in Gantt chart form. The contribution of this paper is threefold. First...

Full text




Publication date: 2017